Adjoint Algorithmic Differentiation Tool Support for Typical Numerical Patterns in Computational Finance
Author
Abstract
We demonstrate the flexibility and ease of use of C++ algorithmic differentiation (AD) tools based on operator overloading when applied to numerical patterns (kernels) arising in computational finance. While adjoint methods and AD have been known in the finance literature for some time, there are few tools capable of handling and integrating with the C++ codes found in production. Adjoint methods are also known to be very powerful, but their memory requirements can become infeasible. We present several techniques for dealing with this problem and demonstrate them on numerical kernels that occur frequently in finance. We build the discussion around our own AD tool dco/c++, which is designed to handle arbitrary C++ codes and to be highly flexible; the concepts sketched here can, however, certainly be transferred to other AD solutions, including in-house tools. An archive of the source code for the numerical kernels, as well as all the AD solutions discussed, can be downloaded from an accompanying website, together with documentation for the code and for dco/c++. Trial licences for dco/c++ are available from NAG.
Similar Resources
Adjoint Code Design Patterns
Adjoint methods have become fundamental ingredients of the scientific computing toolbox over the past decades. Large-scale parameter sensitivity analysis, uncertainty quantification, and nonlinear optimization, as well as modern approaches to deep learning, would otherwise be computationally infeasible. For nontrivial real-world problems, the algorithmic derivation of adjoint numerical simula...
Algorithmic Differentiation of Numerical Methods: Tangent-Linear and Adjoint Solvers for Systems of Nonlinear Equations
We discuss software tool support for the Algorithmic Differentiation (also known as Automatic Differentiation; AD) of numerical simulation programs that contain calls to solvers for parameterized systems of n nonlinear equations. The local computational overhead as well as the additional memory requirement for the computation of directional derivatives or adjoints of the solution of the nonline...
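The symbolic treatment of such solvers can be summarized as follows (in our own notation, which may differ from the paper's): for a solver returning x(p) with F(x, p) = 0, differentiating this identity gives the adjoint update

```latex
% From F(x(p), p) = 0:  F_x\,dx + F_p\,dp = 0, hence
\[
\bar{p} \mathrel{+}= -\left(\frac{\partial F}{\partial p}\right)^{\!T}
  \left(\frac{\partial F}{\partial x}\right)^{\!-T} \bar{x},
\]
% i.e. a single transposed linear solve with the Jacobian at the converged
% solution, independent of how many iterations the nonlinear solver took.
```

This is why the local overhead stays bounded: the solver's internal iterations never need to be taped.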
Likelihood Ratio Method and Algorithmic Differentiation: Fast Second Order Greeks
We show how Adjoint Algorithmic Differentiation can be combined with the so-called Pathwise Derivative and Likelihood Ratio Method to construct efficient Monte Carlo estimators of second order price sensitivities of derivative portfolios. We demonstrate with a numerical example how the proposed technique can be straightforwardly implemented to greatly reduce the computation time of second order...
Calibration in Finance: Very Fast Greeks Through Algorithmic Differentiation and Implicit Function
Adjoint Algorithmic Differentiation is an efficient way to obtain price derivatives with respect to the data inputs. We describe how the efficiency of this process can be further improved when a model calibration is performed. By the implicit function theorem, differentiating the numerical process used in calibration is not required. The resulting implementation is more efficient than automatic diff...
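A short sketch of the implicit-function argument (our notation, not necessarily the paper's): calibration solves g(θ, m) = 0 for model parameters θ given market quotes m, and differentiating this identity yields

```latex
\[
\frac{\partial g}{\partial \theta}\,\frac{d\theta}{dm}
  + \frac{\partial g}{\partial m} = 0
\quad\Longrightarrow\quad
\frac{d\theta}{dm}
  = -\left(\frac{\partial g}{\partial \theta}\right)^{-1}
    \frac{\partial g}{\partial m},
\]
```

so the iterative calibration loop never has to be differentiated; only the Jacobian at the converged point is needed.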
Algorithmic Differentiation of Numerical Methods: Tangent-Linear and Adjoint Direct Solvers for Systems of Linear Equations
We consider the Algorithmic Differentiation (also known as Automatic Differentiation; AD) of numerical simulation programs that contain calls to direct solvers for systems of n linear equations. AD of the linear solvers yields a local overhead of O(n) for the computation of directional derivatives or adjoints of the solution vector with respect to the system matrix and right-hand side. The local...